Dreadful Research and the [Private] European Gambling Industry
The Coventry report on the prevalence of doping and match-fixing is laden with elementary mistakes, writes Declan Hill, here pictured at Play the Game 2011 in Cologne. Photo: Tine Harden.
20.12.2011
By Declan Hill

My old supervisor, the great Diego Gambetta, once said, ‘Every gentleman needs to know statistics.’ Like much of my education, I did not understand it at the time. However, I was fortunate enough to also be mentored in statistics by Johann Lambsdorff (the professor who founded Transparency International’s Corruption Perceptions Index) and the brilliant Anthony Heath. Thanks to them and a number of other hard-working statistics teachers, I eventually understood Diego’s statement. It was not some mark of class attainment, like learning to pass the port or tie a bowtie, but rather the idea that every informed citizen needs numerical literacy. As the cult of numbers grows in our societies, with statistics used to justify all kinds of claims, we all need the ability to discern the truth from the malarkey.
I thought with gratitude about my teachers and all of their painstaking efforts when I recently read one of the most shoddy and ill-conceived pieces of research ever to waste trees. You may have seen the report or the newspaper articles around it. Coventry University researchers have examined doping in sport and match-fixing. They compiled a ‘database’ (really a long list) of ‘cases’ of match-fixing and doping in sports. They compared the two and discovered that in their lists doping cases outnumber fixing cases by approximately 96% to 3%. Then they claim that this research shows that doping is a far more serious problem in sport than match-fixing.
A couple of points. First, I am quite prepared to believe that in certain sports (weight-lifting) and in certain countries (the United States) their claims may be true. I am also willing to believe that in certain sports (football) and in certain countries (Thailand) the opposite may be true. Second, we need good, credible research into sports corruption, whether it is doping or fixing. Sadly, the Coventry report is neither good nor credible. The researchers commit a number of errors that would make most high-school statistics students blush.
I will explain in a moment how such a paper, laden with elementary mistakes, came to be published, but first a brief explanation of what is wrong with the report.
1) The most elementary error is that their database is incomplete. They miss a fair number of corruption cases. One example (but there are others) is that they have neglected the China case. (They do mention one involving the corruption conviction of one of the highest-ranked FIFA referees in China, but that is merely a small indication of the problems in that country.) The case I am referring to is a very large and very important investigation in which match-fixing in the Chinese football league was so bad that the government stepped in, declaring that there were so many corrupted matches that the sport had become ‘a national embarrassment’. The Chinese police eventually arrested over 200 players, coaches, referees, team owners and league officials (including the President of the Chinese League). These numbers of both games and high-profile people indicate that this is not a unique case, but rather a culture of corruption that had entrenched itself in a particular sport in a particular country. Yet this example and a number of similar ones are not mentioned in the Coventry report.
2) There is another significant data collection problem. There is a dedicated unit designed to identify drug cheats in sport – the World Anti-Doping Agency (WADA). The agency has a budget of tens of millions of dollars, a large staff and significant buy-in from many national governments and sports agencies (reluctant in some cases, but it is there). It has been in place for ten years, so there is a well-developed path to identifying drug cheats in sport. On the other hand, there is no similar agency to fight against fixing in sport. If there were a similar agency, we could reasonably expect to see an increase in the number of reported cases of fixing.
3) Academics call it ‘sample bias’. This is where the collection of the data is unwittingly slanted in a particular way. The authors declare that there is less match-fixing in Asia than in Europe. They make this astounding claim because there are fewer reports about fixing in Asia than in Europe. The real problem is that match-fixing is so routine in many Asian countries that it does not make it into international media. For example, there have been police investigations in Singapore and Malaysia of even high school sports events being fixed. Yet because there are such high levels of corruption, these events are not widely reported outside of those countries.
4) The authors do not seem to have thought of the very simple question: ‘How do you measure the prevalence of a deviant act?’ This is a very significant issue in statistics. In layman’s terms, if you study the rates of rape provided by the United Nations, the numbers would indicate that relatively peaceable Sweden has a far higher incidence of sexual assault than Liberia, where there have been widespread war crimes and sexual repression. Of course, this is not true. What these statistics actually indicate is the reported rate of sexual assault. Reported is not actual, and it can vary widely between countries, depending on social norms and police actions.
We can reasonably expect that match-fixing will be significantly under-reported, as in many jurisdictions there is no one looking for it and no system to detect it. Match-fixing can be an illegal activity connected to violent, criminal gangs. This factor gives participants a very strong motivation for not reporting match fixing.
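The reporting gap described above can be made concrete with a toy calculation. The sketch below is a hypothetical illustration only: every figure in it is invented for the sake of the example, and it assumes nothing about the real prevalence of either form of corruption. It simply shows that two activities with identical true prevalence can produce a lopsided split like the report’s 96%-to-3% when one of them has a dedicated detection agency and the other has none.

```python
# Hypothetical illustration: identical true prevalence, unequal detection.
# All numbers below are invented for the sake of the example.

true_doping_cases = 1000      # assumed actual incidents of doping
true_fixing_cases = 1000      # assumed actual incidents of fixing

doping_detection_rate = 0.30  # a WADA-style dedicated agency is looking
fixing_detection_rate = 0.01  # no equivalent agency looks for fixing

reported_doping = true_doping_cases * doping_detection_rate   # 300 reported
reported_fixing = true_fixing_cases * fixing_detection_rate   # 10 reported

share_doping = reported_doping / (reported_doping + reported_fixing)
print(f"Reported share of doping: {share_doping:.0%}")  # ~97%, despite equal true prevalence
```

A list of reported cases measures detection effort as much as it measures the underlying behaviour, which is exactly the trap the Coventry list falls into.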
5) I could go on, but I will skip to the most serious error – the design of the survey itself. I think they may be comparing apples with oranges. Their list of corruption ‘cases’ is too vague. It makes no attempt to say how many fixed matches or corrupt individuals were involved in each of their ‘cases’. So in their list of fixing ‘cases’ there is one listed as ‘Europe’. This is actually a fourteen-month police investigation and a number of court trials by the superb Organized Crime Task Force in Bochum, Germany. Their work exposed approximately 250 fixed matches, involving hundreds of players, referees and club officials in nine different countries across Europe. In the Coventry database, all those matches and all those people are represented as only one ‘case’, rather than hundreds of individual examples of corruption. They repeat this mistake in a number of their other ‘cases’.
Because they do not list their sources, it is unclear whether their list of doping ‘cases’ counts individual athletes caught cheating. If so, it is simply inaccurate measurement. Even if it is not, how do you accurately contrast the number of events that may have been corrupted by doping or fixing – is all of the Tour de France implicated if one cyclist is caught, or merely the winner? Is the BALCO case equivalent to Bochum? If so, then how is it measured? By the number of corrupted athletes involved? None of these rudimentary questions seem to have been asked, and the report is severely weakened by this lack of critical analysis.
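The unit-of-measurement problem above can also be sketched in a few lines. This is a hypothetical illustration: the only figure taken from the text is the roughly 250 fixed matches attributed to the Bochum investigation; the other numbers are invented. It shows how counting aggregated ‘cases’ and counting the individual incidents inside each case can point in opposite directions.

```python
# Hypothetical illustration of the unit-of-measurement problem.
# Each doping 'case' is assumed to be one athlete caught; each fixing
# 'case' is a multi-match investigation (e.g. Bochum, ~250 matches).
# All figures except the Bochum ~250 are invented.

doping_cases = [1] * 96          # 96 'cases', one incident each
fixing_cases = [250, 30, 20]     # 3 'cases', many incidents each

print(len(doping_cases), len(fixing_cases))   # 96 vs 3: counted by 'case', doping dominates
print(sum(doping_cases), sum(fixing_cases))   # 96 vs 300: counted by incident, fixing dominates
```

Whichever unit is the right one, a survey that never states its unit cannot support a conclusion about which problem is ‘more serious’.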
6) Finally, the Coventry University report is not actually an academic paper. If it were, it would have had to go through peer review. There are few academic statistical commentators who would be as gentle in their criticism as I have been.
Why all the scorn for a relatively obscure piece of research? (Full disclosure: I know the authors and am deeply disappointed that they chose to bring out such work.) The reason I take the time to analyse what I believe to be worthless nonsense is because of who funded it.
The entire report was brought out with the sponsorship of the European private gambling industry. To their credit, both the authors and gambling executives have disclosed this fact. However, the report is so full of elementary mistakes that it is an instant credibility-destroyer.
I know the European private gambling industry. I have many friends, sources and contacts there. Most of them are thoroughly decent people. However, I am genuinely perplexed by the attitude of some of their lobbyists. Why don’t they just state the obvious loudly, clearly and repeatedly: ‘We love sports, we hate fixing’? This is a golden opportunity for the European private bookmaking industry to point out that their own companies suffer if matches are fixed. At times the industry does do this, but it is often drowned out by a few of their lobbyists who go about making statements about how the incidence of fixing in sport is exaggerated. Presumably, in the future, they will use the Coventry report to support their claims.
Contrast these efforts with the actions and statements of the police and many sports officials (neither of whom is usually the most open when it comes to making claims about corruption). FIFA’s integrity unit claims that there are now 24 national police investigations into football corruption; around the world there have been arrests of hundreds of players, coaches, referees and sports officials; hundreds of matches are mentioned in court documents as having been fixed in Turkey, Hungary, Finland, Croatia, Greece, Italy, Germany, Belgium, Poland, Czech Republic, Malta, South Korea, Malaysia, Zimbabwe, Nigeria, and many other countries; and the UEFA President Michel Platini has declared match-corruption to be the ‘number one threat’ to their sport. And all of this is just football – it does not even mention the very public problems of fixing in Taiwanese baseball, Japanese sumo, international cricket or other sports.
So in my opinion, the Coventry report does serve a useful purpose. It is so bad, and so heavily supported by commercial interests outside academia, that it is a great red-flag warning. If you are a politician, a sports official or a journalist and someone cites the research, you can immediately question their findings.
This comment piece was first published on Declan Hill's own blog.
Play the Game's earlier article on the report.